40 research outputs found

    Emotion-aware computing using smartphone

    In this project, we address the problem of automatically determining human emotion states using modern-day smartphones. Sensor-rich smartphones have opened up the opportunity to unobtrusively collect user behaviour patterns and activity details, and to infer information about emotion states. The major objective of the project is to determine human emotion accurately and efficiently in order to build a scalable system. Towards that goal, we plan to develop an emotion detection model that leverages the different information sources present on a smartphone. Emotion awareness is finding applicability in digital marketing, healthcare, advertising, personalized learning, and the design of novel user interfaces. We discuss two specific scenarios, Quantified Self and Affective Learning, where this capability can be useful.

    Understanding Psycholinguistic Behavior of predominant drunk texters in Social Media

    In the last decade, social media has evolved into one of the leading platforms to create, share, or exchange information; it is commonly used as a way for individuals to maintain social connections. In this online digital world, people post texts or pictures to express their views socially and create user-to-user engagement through discussions and conversations. Thus, social media has established itself as a bearer of signals relating to human behavior. One can easily build a user-characteristic network by scraping through someone's social media profiles. In this paper, we investigate the potential of social media for characterizing and understanding predominant drunk texters from the perspective of their social, psychological, and linguistic behavior as evident from the content they generate. Our research aims to analyze the behavior of drunk texters on social media and to contrast it with that of non-drunk texters. We use the Twitter social media platform to obtain sets of drunk texters and non-drunk texters and show that we can classify users into these two sets using various psycholinguistic features, with an overall average accuracy of 96.78% and very high precision and recall. Note that such automatic classification can have far-reaching impact: (i) on health research related to addiction prevention and control, and (ii) in eliminating abusive and vulgar content borne by the tweets of drunk texters on Twitter. Comment: 6 pages, 8 figures, ISCC 2018 Workshops - ICTS4eHealth 201
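    The feature-based classification described above can be illustrated with a minimal sketch. The feature names and word lists here are assumptions for illustration, not the paper's actual feature set:

    ```python
    # Hypothetical sketch: extracting simple psycholinguistic features from a tweet,
    # in the spirit of the paper's feature-based classification of drunk texters.
    # The word lists and feature names below are illustrative assumptions.
    from collections import Counter

    PRONOUNS = {"i", "me", "my", "we", "us"}
    NEGATIONS = {"no", "not", "never", "cant", "wont"}

    def psycholinguistic_features(tweet: str) -> dict:
        tokens = [t.strip(".,!?").lower() for t in tweet.split()]
        counts = Counter(tokens)
        total = max(len(tokens), 1)
        return {
            "pronoun_rate": sum(counts[w] for w in PRONOUNS) / total,
            "negation_rate": sum(counts[w] for w in NEGATIONS) / total,
            "avg_word_len": sum(len(t) for t in tokens) / total,
        }

    feats = psycholinguistic_features("I never said I was drunk, not me!")
    ```

    In a full pipeline, such per-user feature vectors would be fed to a standard classifier to separate drunk texters from non-drunk texters.
    
    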

    Towards improving emotion self-report collection using self-reflection

    In an Experience Sampling Method (ESM) based emotion self-report collection study, engaging participants for a long period is challenging due to the repetitiveness of answering self-report probes. This often impacts self-report collection, as participants drop out midway or respond with arbitrary answers. Self-reflection (commonly understood as analyzing past activities in order to operate more efficiently in the future) has been used effectively to engage participants in logging physical, behavioral, or psychological data in Quantified Self (QS) studies. This motivates us to apply self-reflection to improve the emotion self-report collection procedure. We design, develop, and deploy a self-reflection interface and augment it with a smartphone keyboard-based emotion self-report collection application. The interface

    Designing an experience sampling method for smartphone based emotion detection

    Smartphones provide the capability to perform in-situ sampling of human behavior using the Experience Sampling Method (ESM). Designing an ESM schedule involves probing the user repeatedly at suitable moments to collect self-reports. Generating probes in a timely manner, so as to collect high-fidelity user responses while keeping the probing rate low, is challenging. In mobile-based ESM, the timeliness of a probe is also affected by the user's availability to respond to the self-report request. Thus,
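    The scheduling trade-off described above (timely probes, low probing rate, user availability) can be sketched as a minimal rule-based scheduler. The minimum-gap parameter is an assumption for illustration:

    ```python
    # Illustrative sketch of an ESM probe scheduler: issue a self-report probe
    # only when the user is available AND a minimum gap since the last probe has
    # elapsed, keeping the probing rate low. The 60-minute gap is an assumption.
    class ESMScheduler:
        def __init__(self, min_gap_minutes: int = 60):
            self.min_gap = min_gap_minutes
            self.last_probe = None  # minutes since study start, None before first probe

        def should_probe(self, now: int, user_available: bool) -> bool:
            if not user_available:
                return False
            if self.last_probe is not None and now - self.last_probe < self.min_gap:
                return False
            self.last_probe = now
            return True

    sched = ESMScheduler(min_gap_minutes=60)
    decisions = [sched.should_probe(t, avail)
                 for t, avail in [(0, True), (30, True), (90, False), (120, True)]]
    ```

    A real schedule would replace the fixed gap with context signals, which is precisely the design space this paper explores.
    
    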

    Smart-phone based spatio-temporal sensing for annotated transit map generation

    City transit maps are one of the important resources for public navigation in today's digital world. However, the availability of transit maps in many developing countries is very limited, primarily due to the various socio-economic factors that drive privately operated and partially regulated transport services. Public transport in these cities is marred by factors such as uncoordinated waiting times at bus stops, crowding in buses, sporadic road conditions, etc., which also need to be annotated so that commuters can take informed decisions. Interestingly, many of these factors are spatio-temporal in nature. In this paper, we develop CityMap, a system to automatically extract transit routes, along with their eccentricities, from spatio-temporal crowdsensed data collected via commuters' smartphones. We apply a learning-based methodology coupled with a feature selection mechanism to filter the necessary information out of raw smartphone sensor data with minimal user engagement and drain of battery.
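    The kind of feature filtering mentioned above can be sketched in its simplest form: dropping near-constant sensor features before learning. The variance threshold and feature names are assumptions, not CityMap's actual mechanism:

    ```python
    # A minimal sketch of feature selection over sensor readings: keep only
    # features whose variance exceeds a threshold, discarding near-constant ones.
    # The threshold and the example feature names are illustrative assumptions.
    def select_features(rows: list, min_variance: float = 1e-3) -> list:
        kept = []
        for k in rows[0].keys():
            vals = [r[k] for r in rows]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            if var >= min_variance:
                kept.append(k)
        return kept

    rows = [{"speed": 5.0, "bias": 1.0},
            {"speed": 12.0, "bias": 1.0},
            {"speed": 8.0, "bias": 1.0}]
    kept = select_features(rows)
    ```
    
    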

    Impact of experience sampling methods on tap pattern based emotion recognition

    Smartphone-based emotion recognition uses predictive modeling to recognize a user's mental states. In predictive modeling, determining ground truth plays a crucial role in labeling and training the model. The Experience Sampling Method (ESM) is widely used in behavioral science to gather user responses about mental states. Smartphones equipped with sensors provide new avenues for designing Experience Sampling Methods. Sensors provide multiple contexts that can be used to trigger the collection of user responses. However, subsampling of sensor data can bias the inference drawn from trigger-based ESM. We investigate whether continuous sensor data simplifies the design of ESM. We use the user's typing pattern on the smartphone as the context that triggers response collection. We compare context-based and time-based ESM designs to determine the impact of ESM strategies on emotion modeling. The results indicate how different ESM designs compare against each other.
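    A typing-context trigger of the kind described above can be sketched as follows. The session-break threshold is an assumption for illustration:

    ```python
    # Hedged sketch of a typing-based ESM trigger: fire a self-report probe when
    # a typing burst ends, i.e. when the gap between consecutive keystrokes
    # exceeds a session-break threshold. The 30-second threshold is assumed.
    def session_end_triggers(key_times: list, break_gap: float = 30.0) -> list:
        triggers = []
        for prev, cur in zip(key_times, key_times[1:]):
            if cur - prev > break_gap:
                triggers.append(prev)  # probe at the end of the finished burst
        if key_times:
            triggers.append(key_times[-1])  # the final burst also ends
        return triggers

    # Keystroke timestamps (seconds): one burst at 1-3s, another at 60-61s
    probes = session_end_triggers([1.0, 2.0, 3.0, 60.0, 61.0])
    ```

    A time-based design, by contrast, would probe at fixed intervals regardless of typing activity; comparing the two is the subject of this paper.
    
    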

    Does emotion influence the use of auto-suggest during smartphone typing?

    Typing-based interfaces are common across many mobile applications, especially messaging apps. To reduce the difficulty of typing using keyboard applications on smartphones and smartwatches with restricted space, several techniques, such as auto-complete and auto-suggest, have been implemented. Although helpful, these techniques add more cognitive load on the user. Hence, beyond improving the word recommendations themselves, it is useful to understand the patterns of auto-suggestion use during typing. Among the several factors that may influence use of auto-suggest, the role of emotion has been mostly overlooked, often due to the difficulty of unobtrusively inferring emotion. With advances in affective computing and the ability to infer a user's emotional states accurately, it is imperative to investigate how auto-suggest can be guided by emotion-aware decisions. In this work, we investigate correlations between user emotion and usage of auto-suggest, i.e. whether users prefer to use auto-suggest in specific emotion states. We developed an Android keyboard application that records auto-suggest usage and collects emotion self-reports from users in a 3-week in-the-wild study. Analysis of the dataset reveals a relationship between user-reported emotion state and use of auto-suggest. We used the data to train personalized models for predicting use of auto-suggest in specific emotion states. The model can predict use of auto-suggest with an average accuracy (AUCROC) of 82%, showing the feasibility of emotion-aware auto-suggestion.

    Emotion detection from touch interactions during text entry on smartphones

    There are different modes of interaction with a software keyboard on a smartphone, such as typing and swyping. Patterns of such touch interactions on a keyboard may reflect the emotions of a user. Since users may switch between different touch modalities while using a keyboard, automatic detection of emotion from touch patterns must consider both modalities in combination. In this paper, we focus on identifying different features of touch interactions with a smartphone keyboard that lead to a personalized model for inferring user emotion. Since distinguishing typing from swyping activity is important for recording the correct features, we designed a technique to correctly identify the modality. The ground-truth labels for user emotion are collected directly from the user through periodic self-reports. We jointly model typing and swyping features and correlate them with user-provided self-reports to build a personalized machine learning model, which detects four emotion states (happy, sad, stressed, relaxed). We combine these design choices into an Android application, TouchSense, and evaluate it in a 3-week in-the-wild study involving 22 participants. Our key evaluation results and post-study participant assessment demonstrate that it is possible to predict these emotion states with an average accuracy (AUCROC) of 73% (std. dev. 6%, maximum 87%) using these two touch interactions alone.
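    The per-participant AUCROC evaluation reported above can be sketched as follows; the scores and labels are made-up illustrative values, not data from the study:

    ```python
    # Sketch of the paper's evaluation style: compute AUCROC per participant for
    # a personalized model, then average across users. AUCROC is the probability
    # that a random positive example scores higher than a random negative one.
    # All numeric values below are made up for illustration.
    def auc_roc(scores: list, labels: list) -> float:
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    per_user = [
        auc_roc([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0]),  # user A: perfect ranking
        auc_roc([0.6, 0.7, 0.5, 0.2], [1, 0, 1, 0]),  # user B: imperfect ranking
    ]
    avg = sum(per_user) / len(per_user)
    ```
    
    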

    Unsupervised annotated city traffic map generation

    Public bus services in many cities in countries like India are controlled by private owners; hence, building up a database of all the bus routes is non-trivial. In this paper, we leverage smartphone-based sensing to crowdsource and populate an information repository of bus routes in a city. We have developed an intelligent data-logging module for smartphones and a server-side processing mechanism to extract road and bus-route information. From a 3-month-long study involving more than 30 volunteers in 3 different cities in India, we found that the developed system, CrowdMap, can annotate bus routes wit